
    A glance beyond the quantum model

    One of the most important problems in Physics is how to reconcile Quantum Mechanics with General Relativity. Some authors have suggested that this may be realized at the expense of having to drop the quantum formalism in favor of a more general theory. However, as the experiments we can perform nowadays are far from the range of energies where we may expect to observe non-quantum effects, it is difficult to theorize in this respect. Here we propose a fundamental axiom that we believe any reasonable post-quantum theory should satisfy, namely, that such a theory should recover classical physics in the macroscopic limit. We use this principle, together with the impossibility of instantaneous communication, to characterize the set of correlations that can arise between two distant observers. Although several quantum limits are recovered, our results suggest that quantum mechanics could be falsified by a Bell-type experiment if both observers have a sufficient number of detectors.
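    Bell-type experiments between two distant observers are usually quantified by the CHSH correlation value, for which local-classical physics allows at most 2, quantum mechanics at most 2*sqrt(2), and no-signalling alone up to 4. As a rough illustration of that setting (a generic sketch, not the paper's own construction), the following Python snippet evaluates the CHSH value for ideal quantum correlations at optimal measurement settings:

        import numpy as np

        # Generic CHSH illustration; the measurement angles and cosine correlators
        # below are the textbook quantum-mechanical values for a maximally
        # entangled pair, not anything specific to the paper.

        def chsh(E):
            """S = E(a0,b0) + E(a0,b1) + E(a1,b0) - E(a1,b1) from a correlator dict."""
            return E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)]

        # Quantum correlators E(a, b) = cos(a - b) for a maximally entangled pair
        # measured in a common plane, at settings that maximize the violation.
        a = [0.0, np.pi / 2]
        b = [np.pi / 4, -np.pi / 4]
        E_quantum = {(i, j): np.cos(a[i] - b[j]) for i in range(2) for j in range(2)}

        print(f"quantum S = {chsh(E_quantum):.3f}")   # ~2.828 = 2*sqrt(2), Tsirelson's bound
        print("local-classical bound: 2, no-signalling maximum: 4")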

    Biodiversity informatics: the challenge of linking data and the role of shared identifiers

    A major challenge facing biodiversity informatics is integrating data stored in widely distributed databases. Initial efforts have relied on taxonomic names as the shared identifier linking records in different databases. However, taxonomic names have limitations as identifiers, being neither stable nor globally unique, and the pace of molecular taxonomic and phylogenetic research means that a lot of information in public sequence databases is not linked to formal taxonomic names. This review explores the use of other identifiers, such as specimen codes and GenBank accession numbers, to link otherwise disconnected facts in different databases. The structure of these links can also be exploited using the PageRank algorithm to rank the results of searches on biodiversity databases. The key to rich integration is a commitment to deploy and reuse globally unique, shared identifiers (such as DOIs and LSIDs), and the implementation of services that link those identifiers.
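    As a minimal sketch of the link-ranking idea (the identifiers and cross-database links below are invented for illustration), records such as DOIs, GenBank accession numbers, and specimen codes can be treated as nodes of a graph and scored with a plain power-iteration PageRank:

        # Plain power-iteration PageRank over a dict {record: [records it links to]}.
        # The identifiers below are hypothetical examples of a DOI, a GenBank
        # accession, and a museum specimen code.

        def pagerank(links, damping=0.85, iters=50):
            nodes = set(links) | {t for targets in links.values() for t in targets}
            rank = {n: 1.0 / len(nodes) for n in nodes}
            for _ in range(iters):
                new = {n: (1.0 - damping) / len(nodes) for n in nodes}
                for src, targets in links.items():
                    if targets:
                        share = damping * rank[src] / len(targets)
                        for t in targets:
                            new[t] += share
                    else:  # dangling record: spread its rank uniformly
                        for n in nodes:
                            new[n] += damping * rank[src] / len(nodes)
                rank = new
            return rank

        links = {
            "doi:10.1000/example-paper": ["AY123456", "NHM-1902-11-3"],
            "AY123456": ["NHM-1902-11-3"],   # sequence record citing a specimen
            "NHM-1902-11-3": [],             # specimen record with no outgoing links
        }
        for record, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
            print(f"{record}: {score:.3f}")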

    Unifying Parsimonious Tree Reconciliation

    Evolution is a process that is influenced by various environmental factors, e.g. the interactions between different species, genes, and biogeographical properties. Hence, it is interesting to study the combined evolutionary history of multiple species, their genes, and the environment they live in. A common approach to this research problem is to describe each individual evolution as a phylogenetic tree and to construct a tree reconciliation that is parsimonious with respect to a given event model. Unfortunately, most previous approaches are designed for only one setting: host-parasite systems, gene tree/species tree reconciliation, or biogeography. Hence, a method is desirable that addresses the general problem of mapping phylogenetic trees onto each other and covers all varieties of coevolving systems, including, e.g., predator-prey and symbiotic relationships. To close this gap, we introduce a generalized cophylogenetic event model considering the combinatorially complete set of local coevolutionary events. We give a dynamic-programming-based heuristic for solving the maximum parsimony reconciliation problem in time O(n^2) for two phylogenies, each with at most n leaves. Furthermore, we present an exact branch-and-bound algorithm which uses the results from the dynamic programming heuristic to discard partial reconciliations. The approach has been implemented as a Java application which is freely available from http://pacosy.informatik.uni-leipzig.de/coresym. Comment: Peer-reviewed and presented as part of the 13th Workshop on Algorithms in Bioinformatics (WABI 2013).
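    To give a concrete feel for event-based parsimony reconciliation, here is a much-simplified sketch (not the paper's algorithm): a dynamic program over (parasite node, host node) pairs that considers only cospeciation, duplication, and host-switch events, with invented event costs and toy trees; the generalized model in the paper covers the complete set of local coevolutionary events.

        # Simplified cophylogenetic parsimony DP; trees are dicts mapping an
        # internal node to its two children, and event costs are assumptions.

        COSP, DUP, SWITCH = 0.0, 2.0, 3.0   # assumed event costs

        def postorder(tree, node):
            for child in tree.get(node, []):
                yield from postorder(tree, child)
            yield node

        def reconcile(host, parasite, host_root, para_root, leaf_map):
            """cost[p][h] = cheapest mapping of the parasite subtree rooted at p, with p on h."""
            hosts = list(postorder(host, host_root))
            cost = {}
            for p in postorder(parasite, para_root):
                cost[p] = {}
                kids = parasite.get(p, [])
                for h in hosts:
                    if not kids:                     # parasite leaf: fixed by its known host
                        cost[p][h] = 0.0 if leaf_map[p] == h else float("inf")
                        continue
                    p1, p2 = kids
                    below = lambda q, x: min(cost[q][y] for y in postorder(host, x))
                    anywhere = lambda q: min(cost[q][y] for y in hosts)
                    options = [
                        DUP + below(p1, h) + below(p2, h),        # duplication on host h
                        SWITCH + below(p1, h) + anywhere(p2),     # p2 switches host
                        SWITCH + below(p2, h) + anywhere(p1),     # p1 switches host
                    ]
                    if host.get(h):                               # cospeciation at internal host h
                        hL, hR = host[h]
                        options.append(COSP + min(below(p1, hL) + below(p2, hR),
                                                  below(p1, hR) + below(p2, hL)))
                    cost[p][h] = min(options)
            return min(cost[para_root].values())

        # Toy instance: host tree ((A,B)X,C)R and a congruent parasite tree.
        host = {"R": ["X", "C"], "X": ["A", "B"]}
        parasite = {"r": ["x", "c"], "x": ["a", "b"]}
        leaf_map = {"a": "A", "b": "B", "c": "C"}
        print("minimum reconciliation cost:", reconcile(host, parasite, "R", "r", leaf_map))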

    Further constraints on neutron star crustal properties in the low-mass X-ray binary 1RXS J180408.9−342058

    We report on two new quiescent XMM-Newton observations (in addition to the earlier Swift/XRT and XMM-Newton coverage) of the cooling neutron star crust in the low-mass X-ray binary 1RXS J180408.9−342058. Its crust was heated during the ∼4.5 month accretion outburst of the source. From our quiescent observations, fitting the spectra with a neutron star atmosphere model, we found that the crust had cooled from ∼100 eV to ∼73 eV from ∼8 days to ∼479 days after the end of its outburst. However, during the most recent observation, taken ∼860 days after the end of the outburst, we found that the crust appeared not to have cooled further. This suggested that the crust had returned to thermal equilibrium with the neutron star core. We model the quiescent thermal evolution with the theoretical crustal cooling code NSCool and find that the source requires a shallow heat source, in addition to the standard deep crustal heating processes, contributing ∼0.9 MeV per accreted nucleon during outburst to explain its observed temperature decay. Our high-quality XMM-Newton data required an additional hard component to adequately fit the spectra. This slightly complicates our interpretation of the quiescent data of 1RXS J180408.9−342058. The origin of this component is not fully understood. Comment: Accepted for publication by MNRAS.
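    The paper's modelling relies on the NSCool crust-cooling code, but the qualitative behaviour of such quiescent light curves is often summarized by an exponential decay of the effective temperature towards the core (equilibrium) level. The sketch below evaluates that phenomenological curve with parameter values chosen only to roughly reproduce the temperatures quoted above; they are assumptions for illustration, not fitted results from the paper.

        import math

        # kT(t) = (kT_0 - kT_eq) * exp(-t / tau) + kT_eq, with assumed parameters
        # (eV, eV, days) picked to roughly match the quoted ~100 eV at ~8 d and
        # ~73 eV at >~479 d; the real analysis uses the NSCool code instead.
        kT_0, kT_eq, tau = 102.0, 73.0, 120.0

        def crust_temperature(t_days):
            """Effective crust temperature (eV) t_days after the end of the outburst."""
            return (kT_0 - kT_eq) * math.exp(-t_days / tau) + kT_eq

        for t in (8, 479, 860):
            print(f"t = {t:4d} d  ->  kT ~ {crust_temperature(t):5.1f} eV")
        # The curve flattens at kT_eq at late times, i.e. the crust has returned to
        # thermal equilibrium with the core, as inferred for the ~860 d observation.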

    Classical and quantum general relativity: a new paradigm

    We argue that recent developments in discretizations of classical and quantum gravity imply a new paradigm for doing research in these areas. The paradigm consists in discretizing the theory in such a way that the resulting discrete theory has no constraints. This solves many of the hard conceptual problems of quantum gravity. It also appears as a useful tool in some numerical simulations of interest in classical relativity. We outline some of the salient aspects and results of this new framework. Comment: 8 pages, one figure, fifth prize of the Gravity Research Foundation 2005 essay competition.

    Complex Instantons and Charged Rotating Black Hole Pair Creation

    We consider the general process of pair creation of charged rotating black holes. We find that the instantons which describe this process are necessarily complex due to regularity requirements. However, their associated probabilities are real, and fully consistent with the interpretation that the entropy of a charged rotating black hole is the logarithm of the number of its quantum states. Comment: 11 pages, 1 figure, LaTeX, text shortened with only minor changes in content, accepted for Phys. Rev. Letters.

    Can black holes and naked singularities be detected in accelerators?

    We study the conditions for the existence of black holes that could be produced in colliders at the TeV scale if space-time is higher dimensional. Employing the microcanonical picture, we find that their lifetimes depend strongly on the details of the model. If the extra dimensions are compact (ADD model), microcanonical deviations from thermality are in general significant near the fundamental TeV mass, and tiny black holes decay more slowly than predicted by the canonical expression, but still fast enough to disappear almost instantaneously. However, with one warped extra dimension (RS model), microcanonical corrections are much larger and tiny black holes appear to be (meta)stable. Further, if the total charge is not zero, we argue that naked singularities do not occur provided the electromagnetic field is strictly confined to an infinitely thin brane. However, they might be produced in colliders if the effective thickness of the brane is of the order of the fundamental length scale (~1/TeV). Comment: 6 pages, RevTeX 3, 1 figure and 1 table, important changes and additions.

    Cosmological Measures without Volume Weighting

    Many cosmologists (myself included) have advocated volume weighting for the cosmological measure problem, weighting spatial hypersurfaces by their volume. However, this often leads to the Boltzmann brain problem: almost all observations would be made by momentary Boltzmann brains that arise very briefly as quantum fluctuations in the late universe, when it has expanded to a huge size, so that our observations (too ordered for Boltzmann brains) would be highly atypical and unlikely. Here it is suggested that volume weighting may be a mistake. Volume averaging is advocated as an alternative. One consequence may be a loss of the argument that eternal inflation gives a nonzero probability that our universe now has infinite volume. Comment: 15 pages, LaTeX, added references for constant-H hypersurfaces and also an idea for minimal-flux hypersurfaces.

    Fundamental decoherence from relational time in discrete quantum gravity: Galilean covariance

    We have recently argued that if one introduces a relational time in quantum mechanics and quantum gravity, the resulting quantum theory is such that pure states evolve into mixed states. The rate at which states decohere depends on the energy of the states. There is therefore the question of how this can be reconciled with Galilean invariance. More generally, since the relational description is based on objects that are not Dirac observables, the issue of covariance is of importance for the formalism as a whole. In this note we work out an explicit example of a totally constrained, generally covariant system of non-relativistic particles, which shows that the formula for the relational conditional probability is a Galilean scalar and that the decoherence rate is therefore invariant. Comment: 10 pages, RevTeX.